Informative Feature Disentanglement for Unsupervised Domain Adaptation


Abstract

Unsupervised Domain Adaptation (UDA) aims at learning a classifier for an unlabeled target domain by transferring knowledge from a labeled source domain with a related but different distribution. The strategy of aligning the two domains in a latent feature space via metric discrepancy or adversarial learning has achieved considerable progress. However, these existing approaches mainly focus on adapting the entire image and ignore the bottleneck that occurs when forced adaptation of uninformative domain-specific variations undermines the effectiveness of the learned features. To address this problem, we propose a novel component called Informative Feature Disentanglement (IFD), which is equipped with either an adversarial network or a metric discrepancy model. Accordingly, the new architectures, named IFDAN and IFDMN, enable informative feature refinement before adaptation. The proposed IFD is designed to disentangle informative features from uninformative variations, which are produced by a Variational Autoencoder (VAE) with lateral connections from the encoder to the decoder. We cooperatively apply the IFD to conduct supervised disentanglement for the labeled source domain and unsupervised disentanglement for the unlabeled target domain. In this way, informative features are disentangled from domain-specific details before adaptation. Extensive experimental results on three gold-standard datasets, i.e., Office31, Office-Home and VisDA-C, demonstrate the effectiveness of the proposed IFDAN and IFDMN models for UDA.
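
As a rough, unofficial sketch of the mechanism described above (not the authors' released code), the snippet below builds a small VAE whose decoder receives a lateral connection from the encoder, so that uninformative low-level detail can bypass the latent code during reconstruction. All layer sizes and names (LateralVAE, vae_loss) are assumptions for illustration.

```python
import torch
import torch.nn as nn

class LateralVAE(nn.Module):
    """Toy VAE with one lateral (skip) connection from encoder to decoder."""
    def __init__(self, in_dim=2048, hid_dim=512, z_dim=128):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU())
        self.enc2 = nn.Sequential(nn.Linear(hid_dim, hid_dim), nn.ReLU())
        self.mu = nn.Linear(hid_dim, z_dim)      # latent code kept for adaptation
        self.logvar = nn.Linear(hid_dim, z_dim)
        self.dec1 = nn.Sequential(nn.Linear(z_dim, hid_dim), nn.ReLU())
        self.out = nn.Linear(hid_dim, in_dim)

    def forward(self, x):
        h1 = self.enc1(x)
        h2 = self.enc2(h1)
        mu, logvar = self.mu(h2), self.logvar(h2)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization
        # Lateral connection: the decoder also sees an encoder activation, so
        # low-level, domain-specific detail can reach the reconstruction
        # without being squeezed through z, leaving z freer to retain only
        # the informative content.
        recon = self.out(self.dec1(z) + h1)
        return recon, mu, logvar

def vae_loss(recon, x, mu, logvar):
    rec = nn.functional.mse_loss(recon, x)                       # reconstruction term
    kld = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())  # KL term
    return rec + kld

feats = torch.randn(16, 2048)                 # e.g. backbone features
recon, mu, logvar = LateralVAE()(feats)
loss = vae_loss(recon, feats, mu, logvar)
```

In the paper's setting this objective would be applied with supervision on the source domain and without labels on the target domain; the sketch shows only the shared VAE-with-laterals building block.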


Similar resources

Adversarial Feature Augmentation for Unsupervised Domain Adaptation

Recent works showed that Generative Adversarial Networks (GANs) can be successfully applied in unsupervised domain adaptation, where, given a labeled source dataset and an unlabeled target dataset, the goal is to train powerful classifiers for the target samples. In particular, it was shown that a GAN objective function can be used to learn target features indistinguishable from the source ones...
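
The snippet below is a hedged sketch of the generic adversarial-alignment recipe this line of work builds on, not the feature-augmentation GAN of the cited paper: a domain discriminator learns to separate source from target features, while the feature extractor is trained to fool it. All shapes and module names are made up for illustration.

```python
import torch
import torch.nn as nn

feat = nn.Sequential(nn.Linear(256, 128), nn.ReLU(), nn.Linear(128, 64))  # feature extractor
disc = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 1))      # domain discriminator
bce = nn.BCEWithLogitsLoss()
opt_f = torch.optim.Adam(feat.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(disc.parameters(), lr=1e-4)

xs, xt = torch.randn(32, 256), torch.randn(32, 256)  # source / target batches

# 1) Discriminator step: label source features 1, target features 0.
d_loss = (bce(disc(feat(xs).detach()), torch.ones(32, 1))
          + bce(disc(feat(xt).detach()), torch.zeros(32, 1)))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# 2) Feature step: update the extractor so target features look "source" to disc.
g_loss = bce(disc(feat(xt)), torch.ones(32, 1))
opt_f.zero_grad(); g_loss.backward(); opt_f.step()
```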


Unsupervised Domain Adaptation with Feature Embeddings

Representation learning is the dominant technique for unsupervised domain adaptation, but existing approaches often require the specification of “pivot features” that generalize across domains, which are selected by task-specific heuristics. We show that a novel but simple feature embedding approach provides better performance, by exploiting the feature template structure common in NLP problems.
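
To make the idea concrete, here is a hedged sketch, not the paper's actual algorithm: each NLP feature template gets its own dense embedding table, and an instance is represented by concatenating the embeddings of its active features rather than by hand-picked sparse pivot features. The template names, sizes, and the featurize helper are all hypothetical.

```python
import torch
import torch.nn as nn

templates = ["word", "prev_word", "next_word", "suffix2"]  # hypothetical templates
vocab_size, emb_dim = 10000, 50
emb = nn.ModuleDict({t: nn.Embedding(vocab_size, emb_dim) for t in templates})

def featurize(ids_per_template):
    # ids_per_template: {template_name: LongTensor of feature ids}
    return torch.cat([emb[t](ids_per_template[t]) for t in templates], dim=-1)

ids = {t: torch.randint(0, vocab_size, (8,)) for t in templates}
x = featurize(ids)
print(x.shape)  # (8, 200): dense representation shared across domains
```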


Unsupervised Multi-Domain Adaptation with Feature Embeddings

Representation learning is the dominant technique for unsupervised domain adaptation, but existing approaches have two major weaknesses. First, they often require the specification of “pivot features” that generalize across domains, which are selected by task-specific heuristics. We show that a novel but simple feature embedding approach provides better performance, by exploiting the feature tem...


Deep Nonlinear Feature Coding for Unsupervised Domain Adaptation

Deep feature learning has recently emerged with demonstrated effectiveness in domain adaptation. In this paper, we propose a Deep Nonlinear Feature Coding framework (DNFC) for unsupervised domain adaptation. DNFC builds on the marginalized stacked denoising autoencoder (mSDA) to extract rich deep features. We introduce two new elements to mSDA: domain divergence minimization by Maximum Mean Dis...
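
The Maximum Mean Discrepancy term that methods like DNFC minimize between domains can be sketched directly; the RBF kernel and fixed bandwidth sigma below are assumptions for illustration, not the paper's settings.

```python
import torch

def rbf_kernel(a, b, sigma=1.0):
    # Pairwise squared Euclidean distances, then a Gaussian kernel.
    d2 = torch.cdist(a, b).pow(2)
    return torch.exp(-d2 / (2 * sigma ** 2))

def mmd(xs, xt, sigma=1.0):
    # Biased empirical MMD^2 between source and target feature batches.
    return (rbf_kernel(xs, xs, sigma).mean()
            + rbf_kernel(xt, xt, sigma).mean()
            - 2 * rbf_kernel(xs, xt, sigma).mean())

xs, xt = torch.randn(64, 128), torch.randn(64, 128)
print(mmd(xs, xt))  # scalar; adding it to the training loss pulls the domains together
```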


Deep Unsupervised Domain Adaptation for Image Classification via Low Rank Representation Learning

Domain adaptation is a powerful technique when ample labeled data with similar attributes is available in a different domain. In real-world applications there are huge amounts of data, but most of them are unlabeled. Domain adaptation is therefore effective for image classification, where obtaining adequate labeled data is expensive and time-consuming. We propose a novel method named DALRRL, which consists of deep ...
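
As a rough illustration of the low-rank machinery such methods build on (not DALRRL itself), the sketch below applies singular-value thresholding, the standard proximal step for nuclear-norm (low-rank) regularization. The function name svt and the threshold tau are illustrative.

```python
import torch

def svt(M, tau):
    # Singular-value thresholding: shrink singular values toward zero,
    # which is the proximal operator of tau * ||M||_* (nuclear norm).
    U, S, Vh = torch.linalg.svd(M, full_matrices=False)
    S = torch.clamp(S - tau, min=0.0)
    return U @ torch.diag(S) @ Vh

M = torch.randn(100, 50)
low_rank = svt(M, tau=5.0)  # a lower-rank approximation of M
print(torch.linalg.matrix_rank(low_rank))
```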



Journal

Journal title: IEEE Transactions on Multimedia

Year: 2022

ISSN: 1520-9210, 1941-0077

DOI: https://doi.org/10.1109/tmm.2021.3080516